Search Results: "mlang"

18 July 2014

Mario Lang: Crowdsourced accessibility: Self-made digital menus

Something straight out of the real world: menu cards in restaurants are not nice to deal with if you are blind. It is an old problem we grow used to ignoring over time, but still something that can be quite nagging. There are a lot of psychological issues involved in this one. Of course, you can ask for the menu to be read out to you by the staff. While they usually do their best, you end up missing out on some things most of the time. First of all, depending on the current workload in the restaurant, the staff will usually try to save some time and not read everything to you. What they usually do is try to understand what type of meal you are interested in, and just read the choices from that category to you. While this can be considered a service in some situations (human preprocessing), there are situations where you will definitely miss a highlight on the menu that you would have liked to choose if you had known it was there.

And even if the staff decides to read the complete menu to you (which is rare), you are confronted with the 7-things-in-my-head-at-once problem. It is usually rather hard to decide amongst a list of more than 7 items, because our short-term memory is rather limited. What sighted restaurant-goers do is skip back and forth between the available options until they hit a decisive moment. True, that can take a while, but it is definitely a lot easier if you can perform "random access reads" on the list of choices yourself. However, if someone presents a substantial number of choices to you in a row, as sequential speech, you lose that random access ability. You either remember every choice from the beginning and do your choosing mentally (if you have extraordinary mental abilities), or you end up asking the staff to read previous items aloud again. This can work, but usually it doesn't. At some point, you do not want to bother the staff anymore, and you even start to feel stupid for asking again and again, while this is something totally normal for every sighted person; it is just that "they" do their "random access browsing" on their own, so "they" have no need to feel bad about how long it takes them to decide, minus the typical social pressure that arises after a certain time for everyone, at least if you are dining in a group. In very rare cases, you happen to meet staff that is truly "awake", doing their best not to let you feel that they might be pressed for time, and really taking as much time as necessary to help you make the perfect decision. This is rare, but if it happens, it is almost a magical moment: one of those moments where there are no "artificial" barriers between humans communicating. Anyway, I am drifting away.

The perfect solution to this problem is to provide random access browsing of a restaurant menu with the help of digital devices. Trying to make braille menus available in all restaurants is a goal which is not realistically reachable. Menus go out of date and need changing, and getting a physical braille copy updated and reprinted is considerably more involved than with digital media. Restaurant owners will also likely not see the benefit of providing a braille card for a very small circle of customers. With a digital online menu, that is a completely different story. These days, almost every blind person in at least my social circles owns an iOS (or similar) device. These devices have speech synthesis and web browsers. Of course, some restaurants, especially in urban areas, do already have a menu online.

I have found them manually with Google and friends sometimes in the past, which has already given me the ability to actually sit back and really comfortably choose amongst the available offerings myself, without having to bother a human, and without having to feel bad about (ab)using their time. However, the case where a restaurant really has its menu online is still rather rare in my area. And it can be tedious to google for a restaurant website. Sometimes the website itself is only marginally accessible, which makes it even more frustrating to get a relaxed dinner experience.

I have recently discovered a location-based solution for the restaurant-menu problem. Foursquare offers the ability to provide a direct link to the menu in a restaurant entry. I figured, since all you need to do is write a single webpage where the (common) menu items are listed per restaurant, that I could begin to create restaurant menus for my favourite locations on my own. Well, not quite, but almost. I will sometimes need help from others to get the menu digitized, but that's just a one-time piece of work I hopefully can outsource :-). Once the actual content is in my INBOX, I create a nice HTML page listing the menu in a rather speech-based-browser-friendly way. I have begun to do this today, with the menu of a restaurant just about 500 meters away from my apartment. Unterm goldenen Dachl now has a menu online, and the Foursquare change request to publish the corresponding URL is already pending. I don't fully understand how the Foursquare change review process works yet, but I hope the URL will be published in the upcoming days/weeks.

I am using Foursquare because it is the backend of a rather popular mobile navigation app for blind people, called BlindSquare. BlindSquare lets you comfortably use OpenStreetMap and Foursquare data to get an overview of your surroundings. If a food place has a menu entry listed in Foursquare, BlindSquare conveniently shows it to you and opens a browser if you tap it. So there is no need to actually search for the restaurant; you can just use the location-based search of BlindSquare to discover the restaurant entry and its menu link directly from within BlindSquare. Actually, you could even find a restaurant by accident and, with a little luck, find the menu for it by location, without even knowing what the restaurant is called. Isn't that neat? Yeah, that's how it is supposed to work; that's as much independence as you can get.

And it is, as the title suggests, crowdsourced accessibility. Because while it is nice if restaurant owners care to publish their menu themselves, if they haven't, you can do it yourself: either as a user of assistive technologies, to scratch your own itch, or as a friend of a person with a need for assistive technologies. Next time you go to lunch with your blind friend, consider making the menu available to them digitally in advance, instead of reading it. Other people will likely thank you for that, and you will have actually achieved something today. And if you happen to put a menu online, make sure to submit a change request to Foursquare. Many blind people are using BlindSquare these days, which makes it super easy for them to discover the menu.

15 July 2014

Mario Lang: Mixing vinyl again

The turntables have me back, after quite a long mixing break. I used to do straight 4-to-the-floor, mostly acid or hardtek. You can find an old mix of mine on SoundCloud; that one is actually from back in 2006. But currently I am more into drum and bass. It is an interesting mixing experience, since it is considerably harder. Here is a small but very recent minimix. Experts in the genre might notice that I am mostly spinning stuff from BlackOutMusicNL, admittedly my favourite label right now.

5 July 2014

Mario Lang: I love my MacBookAir with Debian

In short: I love my MacBook Air. It is the best (laptop) hardware I have ever owned. I have seen hardware which was much more flaky in the past. I can set the display backlight to zero via software, which saves me a lot of battery life and also offers a bit of anti-spy-across-my-shoulder support. WLAN and Bluetooth work nicely. And I just love the form factor and the touch-feeling of the hardware. I even had the bag I use to carry my braille display modified so that the Air just fits in. I can't say how it behaves with X11. Given how flaky accessibility with graphical desktops on Linux is, I have still not made the switch. My MacBook Air is my perfect mobile terminal, I LOVE it.

I am sort of surprised about the recent rant of Paul about MacBook hardware. It is rather funny that we perceive the same technology so radically differently. And after reading the second part of his rant I am wondering if I am no longer allowed to consider myself part of the "hardcore F/OSS world", because I don't consider Apple as evil as apparently most others do. Why? Well, first of all, I actually like the hardware. Secondly, you have to show me a vendor first that builds usable accessibility into their products, and I mean all their products, without any extra price tag attached. Once the others start to consider people with disabilities, we can talk about Apple-bashing again. But until then, sorry, you don't see the picture as I do.

Apple was the first big company on the market to take accessibility seriously. And they are still unbeaten, at least when it comes to bells and whistles included. I can unbox and configure any Apple product sold currently completely without assistance. With some products, you just need to know a single keypress (triple-press the home button for touch devices and Cmd+F5 for Mac OS X), and with others, during initial bootup, a speech synthesizer even tells you how to enable accessibility in case you need it. And after that is enabled, I can perform the setup of the device completely on my own. I don't need help from anyone else. And after the setup is complete, I can use 95% of the functionality provided by the operating system. And I am blind, part of a very small marginal group so to speak.

In Debian circles, I have even heard the sentiment that we supposedly have to accept that small marginal groups are ignored sometimes. Well, as long as we think that way, as long as we strictly think economically, we will never be able to go there, fully. And we will never be the universal operating system, actually. Sorry to say that, but I think there is some truth to it. So, who is evil? Scratch-your-own-itch doesn't always work to cover everything. How do we motivate contributors to work on things they don't personally need (yet)? How can we ensure that complicated but seldom-used features stay stable and do not fall to dust just because some upstream decides to rewrite an essential subcomponent of the dependency tree? I don't know. All I know is that these issues need to be solved in a universal operating system.

25 June 2014

Mario Lang: Four new packages on the GNU Emacs Package Archive (ELPA)

I have begun to push some of the Emacs Lisp packages I have been working on over the last few years to GNU ELPA, the Emacs Lisp Package Archive. That means you can use "M-x list-packages RET" to install them in GNU Emacs 24.
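If you prefer to script the installation instead of going through the package menu, the same thing can be done with the built-in package.el functions (the package names below are simply the ones discussed in this post):
;; Refresh the GNU ELPA index and install the packages mentioned above.
(require 'package)
(package-initialize)
(package-refresh-contents)
(dolist (pkg '(osc poker metar chess))
  (unless (package-installed-p pkg)
    (package-install pkg)))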
OpenSound Control library
In 2007, I wrote OSC server and client support for Emacs. I used it back then to communicate with SuperCollider and related software. osc.el is a rather simple package with no user-visible functionality, as it only provides a library for Emacs Lisp programmers. It is probably most interesting to people wanting to remote-control (modern) sound-related software from within Emacs Lisp.
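To give a rough idea of how the library is meant to be used, here is a minimal sketch, assuming the osc-make-client and osc-send-message entry points; the port and OSC address below are arbitrary example values:
;; Send a single OSC message to a synth listening on localhost:7770.
(require 'osc)
(let ((client (osc-make-client "localhost" 7770)))
  (osc-send-message client "/volume" 0.8)  ; address and argument are made up
  (delete-process client))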
Texas hold'em poker
As my interest in poker has recently been rekindled, one thing led to another, and I began to write a poker library for GNU Emacs. It was a very fun experience. Version 0.1 of poker.el can simulate a table of ten players. The bots do make their own decisions, although the bot code is very simple. The complete game is currently played in the minibuffer, so there is definitely room for user interface enhancements, such as a poker table mode for displaying a table in a buffer.
Weather information from weather.noaa.gov
I started to write metar.el in 2007 as well, but never really finished it to a releasable state. I use it personally rather often, but had never cleaned it up for a release. This has changed. It plugs in with other GNU Emacs features that make use of your current location. In particular, "M-x sunrise-sunset" and "M-x phases-of-moon" use the same variables (calendar-latitude and calendar-longitude) to determine where you are. "M-x metar" will determine the nearest airport weather station and display the weather information provided by that station.
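The only configuration it really needs is your location; a minimal sketch (the coordinates are arbitrary example values):
;; sunrise-sunset, phases-of-moon and metar all share these two variables.
(setq calendar-latitude 47.1     ; example value, adjust to your location
      calendar-longitude 15.4)   ; example value, adjust to your location
;; M-x metar then picks the nearest airport weather station automatically.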
Chess
Finally, after many, many years of development interrupted by countless hiatuses, chess.el is now out as version 2.0.3! For a more detailed article about chess.el, see here.

23 June 2014

Mario Lang: Slashdot did not improve

I used to read Slashdot, many, many years ago. However, when they started to do more and more "Your rights online" articles, I gradually stopped reading it. Someone just sent me a link to a Slashdot article: m.slashdot.org. If I try to open this with Lynx, I get the following error page:
It looks like your browser doesn't support JavaScript or it is disabled.
Please use the desktop site instead.
OK FaceBook^WSlashdot, all the geeks formerly employed by you have obviously left. And no, I am not going to read you anytime soon. BTW, the link to the "desktop site" goes here, which loses the reference to the particular story, if you notice. So your redirection site is totally useless. Thanks for nothing. Hush hush, off to the eternal internet graveyard where you belong.

13 May 2014

Mario Lang: C++14 Lambdas

The addition of lambdas in C++11 has greatly improved the flexibility of the language as a whole. C++14 will add a few more abilities. C++ Truths has a very nice and brief overview of what C++14 will allow. Part 1 goes from the very basics via closures to a small demonstration of partial function application. And Part 2 continues by showing recursive lambdas, overloaded lambdas, pattern matching, in-place parameter pack expansion (neat) and finally even memoisation. These code examples have been the most inspiring ones I have seen lately. If you have any interest in C++, go read them.
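For readers who have not looked at C++14 yet, the two headline lambda additions are generic lambdas (auto parameters) and init-captures; a tiny self-contained sketch:
#include <algorithm>
#include <iostream>
#include <memory>
#include <vector>

int main() {
  std::vector<int> v{3, 1, 2};
  // Generic lambda: parameter types are deduced at each call site (C++14).
  auto less = [](auto const &a, auto const &b) { return a < b; };
  std::sort(v.begin(), v.end(), less);

  // Init-capture: move a unique_ptr into the closure (C++14).
  auto p = std::make_unique<int>(42);
  auto show = [q = std::move(p)] { std::cout << *q << '\n'; };
  show();
}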

7 May 2014

Mario Lang: Planet bug: empty alt tags for hackergotchis

There is a strange bug on Planet Debian that I have been seeing since I joined. It is rather minor, but since it is an accessibility bug, I'd like to mention it here. I have written to the Planet Debian maintainers, and was told to figure it out myself. This is a pattern: accessibility is apparently considered wishlist. And the affected people are supposed to fix it on their own. It is better if I don't say anything more about that attitude.
The Bug
On Planet Debian, only some people have an alt tag for their hackergotchi, while all the configured entries look similar. There is no obvious difference in the configuration, but still, only some users here have a proper alt tag for their hackergotchi. Here is a list:
  • Dirk Eddelbuettel
  • Steve Kemp
  • Wouter Verhelst
  • Mehdi (noreply@blogger.com)
  • Andrew Pollock
  • DebConf Organizers
  • Francois Marier
  • The MirOS Project (tg@mirbsd.org)
  • Paul Tagliamonte
  • Lisandro Damián Nicanor Pérez Meyer (noreply@blogger.com)
  • Joey Hess
  • Chris Lamb
  • Mirco Bauer
  • Christine Spang
  • Guido Günther
These people/organisations currently displayed on Planet Debian have a proper alt tag for their hackergotchi. All the other members have none. In Lynx, it looks like the following:
hackergotchi for
And for those where it works, it looks like:
hackergotchi for Dirk Eddelbuettel
Strange, isn't it? If you have any idea why this might be happening, let me know, or even better, tell the Planet Debian maintainers how to fix it. P.S.: The planet-venus package says it is a rewrite of Planet, and Planet can be found in Debian as well. I don't see it in unstable, maybe I am blind? Or has it been removed recently? If so, the package description of planet-venus is wrong.

Mario Lang: Accessible single-player texas hold'em poker for iOS

I have been looking for an app like this since I got my first iOS device in December 2011. Finally, it is here! A single-player (bot-driven) poker app for iOS, THETA Poker Pro, fully accessible and usable with VoiceOver. AppleVis has a review. It doesn't happen very often, but app programmers in the iOS universe do indeed sometimes think about accessibility support, and the APIs provided by Apple are useful enough to allow programmers to write very accessible apps. You can say about Apple whatever you want; currently, it is the company providing the best accessibility support on the market. Why? Because they made accessibility a first-class citizen of their platform(s). This is where policy helps. If you can dictate top-down that you support people with disabilities, things actually start to happen. If you have to ask, hope, and wait, like it is with free software, things do not really progress as fast as the users need them to.

Back to THETA Poker Pro: the default configuration is already very usable with VoiceOver. However, if you want the cards placed on the board announced to you, so that you do not have to discover them manually by touch, you can enable the "Card Announcement" item in the Options menu. You can also make the message delay a bit longer, so that all messages are actually fully spoken and not cut off sometimes. With these two settings adjusted, and maybe "Animation" set to "Very fast", the game feels extremely nice. There is actually nothing I would want to change, which does not happen very often when I test a program for its accessibility. With these settings changed, game play is very smooth with VoiceOver: you basically just have to tap your cards to check, tap the deck to fold, or tap your chips to raise. Very simple, and these three "buttons" are at the bottom of the screen, so rather easy and quick to find. All other activity is automatically announced by VoiceOver.

I have played a few hundred hands already with this app. It is a wonderful way to pass time. For instance, I don't like to go to my doctor, because I usually wait up to two or three hours. I had to pay her a visit on Monday. While waiting, I played "a few" hands, and suddenly, I was already called in. When I came out again, I checked the time and was rather surprised that yes, I had waited two hours again, but this time, I didn't notice! :-) Special thanks go to the author(s) of this app. It is a good example of an app that was not specially made for the blind, but which feels like it was. Thanks, you've made my week!
A small rant
OTOH, it makes me sad when I think about my beloved Linux platform and GUI accessibility. We have been stuck since 2004 with a bit of desktop support plus a half-working Firefox. During the D-Bus rewrite, the quality of GUI accessibility dropped so much that I had to take time off from Linux GUI accessibility to stay sane. It is back to where it was in 2006, yay, but we haven't made a lot of real progress in the last 8 years. Granted, Firefox has improved, but to my taste, not enough. I still do all my email, shell work, programming and some other things on Linux of course, but I notice that I do more and more casual stuff on iOS; it is just sooo much more usable. I do almost all my surfing with mobile Safari, because it just works. Firefox works sometimes, and at other times working with it feels so slow that I am actually getting angry. The scratch-your-own-itch philosophy combined with a very small marginal group is poison for success. We'd need much more funding, and people working actively on this stuff as their day job, if we ever want to be competitive with existing solutions.

12 April 2014

Mario Lang: Emacs Chess

Between 2001 and 2004, John Wiegley wrote emacs-chess, a rather complete chess library for Emacs. I found it around 2004, and was immediately hooked. Why? Because Emacs is configurable, and I was hoping that I could customize the chessboard display much more than with any other console-based chess program I have ever seen. And I was right. One of the four chessboard display types is exactly what I was looking for, chess-plain.el:
  
8 tSlDjLsT 
7 XxXxXxXx 
6         
5         
4         
3         
2 pPpPpPpP 
1 RnBqKbNr 
  
  abcdefgh
This might look confusing at first, but I have to admit that I grew rather fond of this way of displaying chess positions as ASCII diagrams. In this configuration, initial letters for (mostly) German chess piece names are used for the black pieces, and English chess piece names are used for the white pieces. Uppercase is used to indicate if a piece is on a black square and braille dot 7 is used to indicate an empty black square. chess-plain is completely configurable though, so you can have more classic diagrams like this as well:
  
8 rnbqkbnr 
7 pppppppp 
6  + + + + 
5 + + + +  
4  + + + + 
3 + + + +  
2 PPPPPPPP 
1 RNBQKBNR 
  
  abcdefgh
Here, upper case letters indicate white pieces, and lower case letters black pieces. Black squares are indicated with a plus sign. However, as with many Free Software projects, Emacs Chess has been rather dormant for the last 10 years. For some reason that I cannot even remember right now, my interest in Emacs Chess was reignited roughly 5 weeks ago.
Universal Chess Interface
It all began when I did a casual apt-cache search for chess engines, only to discover that a number of free chess engines had been developed and packaged for Debian in the last 10 years. In 2004 there were basically only GNUChess, Phalanx and Crafty. These days, a number of UCI-based chess engines have been added, like Stockfish, Glaurung, Fruit or Toga2. So I started by learning how the new chess engine communication protocol, UCI, actually works. After a bit of playing around, I had a basic engine module for Emacs Chess that could play against Stockfish. After I had developed a thin layer for all the things that UCI engines have in common (chess-uci.el), it was actually very easy to implement support for Stockfish, Glaurung and Fruit in Emacs Chess. Good, three new free engines supported.
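For a rough idea of what the protocol looks like on the wire, here is a sketch of a minimal UCI exchange; the annotations after # are mine, and the exact id/info output naturally varies per engine:
uci                            # GUI: switch the engine into UCI mode
id name <engine name>          # engine: identification, followed by its options
uciok
isready                        # GUI: synchronisation point
readyok
position startpos moves e2e4   # GUI: set up the position after 1. e4
go movetime 1000               # GUI: think for one second
info depth 12 ...              # engine: analysis lines while it is thinking
bestmove e7e5                  # engine: the move it wants to play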
Opening books
When I learnt about the UCI protocol, I discovered that most UCI engines these days do not do their own opening book handling. In fact, it is sort of expected that the GUI plays opening book moves. And here one thing led to another. There is quite good documentation about the Polyglot chess opening book binary format on the net. And since I absolutely love to write binary data decoders in Emacs Lisp (don't ask, I don't know why) I immediately started to write Polyglot book handling code in Emacs Lisp, see chess-polyglot.el. It turns out that it is relatively simple and actually performs very well. Even a lookup in an opening book bigger than 100 megabytes happens more or less instantaneously, so you do not notice the time required to find moves in an opening book. Binary search is just great. And binary-searching binary data in Emacs Lisp is really fun :-). So Emacs Chess can now load and use Polyglot opening book files. I integrated this functionality into the common UCI engine module, so Emacs Chess, when fed with a Polyglot opening book, can now choose moves from that book instead of consulting the engine to calculate a move. Very neat! Note that you can create your own opening books from PGN collections, or just download a Polyglot book made by someone else.
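The on-disk format itself is pleasantly simple: a Polyglot book is a sorted array of 16-byte entries, each holding a 64-bit position key, a 16-bit encoded move, a 16-bit weight and a 32-bit learn field, all big-endian. A minimal Emacs Lisp sketch of decoding one entry from a unibyte buffer (not the actual chess-polyglot.el code) could look like this:
;; Sketch: decode the Nth 16-byte entry of a Polyglot book that has been
;; read into a unibyte buffer, e.g. with insert-file-contents-literally.
(defun my-polyglot-uint (pos len)
  "Read a big-endian unsigned integer of LEN bytes starting at buffer POS."
  (let ((value 0))
    (dotimes (i len value)
      (setq value (+ (* value 256) (char-after (+ pos i)))))))

(defun my-polyglot-entry (n)
  "Return (KEY MOVE WEIGHT LEARN) for entry number N (zero-based)."
  (let ((pos (1+ (* n 16))))                  ; buffer positions are 1-based
    (list (my-polyglot-uint pos 8)            ; 64-bit position key
          (my-polyglot-uint (+ pos 8) 2)      ; 16-bit encoded move
          (my-polyglot-uint (+ pos 10) 2)     ; 16-bit weight
          (my-polyglot-uint (+ pos 12) 4))))  ; 32-bit learn value
Because the entries are sorted by key, finding the moves for a position is a binary search over entry numbers, which is why even a 100-megabyte book answers more or less instantly.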
Internet Chess Servers
Later I reworked the internet chess server backend of Emacs Chess a bit (sought games are now displayed with tabulated-list-mode), and found and fixed some (rather unexpected) bugs in the way legal moves are calculated (if we take the opponent's rook, their ability to castle needs to be cleared). Emacs Chess supports two of the most well known internet chess servers: the Free Internet Chess Server (FICS) and chessclub.com (ICC).
A Chess engine written in Emacs Lisp
And then I rediscovered my own little chess engine implemented in Emacs Lisp. I wrote it back in 2004, but never really finished it. After I finally found a small (but important) bug in the static position evaluation function, I was motivated enough to fix my native Emacs Lisp chess engine. I implemented quiescence search so that capture combinations are actually evaluated and not just pruned at a hard depth limit. This made the engine quite a bit slower, but it actually results in relatively good play. Since the thinking time went up, I implemented a small progress bar so one can actually watch what the engine is doing right now. chess-ai.el is a very small Lisp implementation of a chess engine. Static evaluation, alpha-beta and quiescence search included. It covers the basics, so to speak. So if you don't have any of the above-mentioned external engines installed, you can even play a game of chess against Emacs directly.
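For readers unfamiliar with the terminology, here is a compact sketch of what negamax alpha-beta with a quiescence fallback looks like in Emacs Lisp. It is not chess-ai.el itself, and my-evaluate, my-moves, my-captures, my-do-move and my-undo-move are hypothetical placeholders for real evaluation and move generation:
;; Sketch only: negamax alpha-beta that switches to a capture-only
;; quiescence search at depth 0 instead of cutting combinations short.
(defun my-quiescence (pos alpha beta)
  (let ((stand-pat (my-evaluate pos)))
    (setq alpha (max alpha stand-pat))
    (if (>= alpha beta)
        beta
      (catch 'cutoff
        (dolist (move (my-captures pos) alpha)
          (my-do-move pos move)
          (let ((score (- (my-quiescence pos (- beta) (- alpha)))))
            (my-undo-move pos move)
            (setq alpha (max alpha score))
            (when (>= alpha beta) (throw 'cutoff beta))))))))

(defun my-alpha-beta (pos depth alpha beta)
  (if (zerop depth)
      (my-quiescence pos alpha beta)
    (catch 'cutoff
      (dolist (move (my-moves pos) alpha)
        (my-do-move pos move)
        (let ((score (- (my-alpha-beta pos (1- depth) (- beta) (- alpha)))))
          (my-undo-move pos move)
          (setq alpha (max alpha score))
          (when (>= alpha beta) (throw 'cutoff beta)))))))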
Other features
The feature list of Emacs Chess is rather impressive. You can not just play a game of chess against an engine, you can also play against another human (either via ICS or directly from Emacs to Emacs), view and edit PGN files, solve chess puzzles, and much much more. Emacs Chess is really a universal chess interface for Emacs.
Emacs-chess 2.0
In 2004, John and I were already planning to get emacs-chess 2.0 out the door. Well, 10 years have passed, and both of us have forgotten about this wonderful codebase. I am trying to change this. I am in development/maintenance mode for emacs-chess again. John has also promised to find a bit of time to work on a final 2.0 release. If you are an Emacs user who knows and likes to play chess, please give emacs-chess a whirl. If you find any problems, please file an issue on GitHub, or better yet, send us a pull request. There is an emacs-chess Debian package which has not been updated in a while. If you want to test the new code, be sure to grab it from GitHub directly. Once we reach a state that at least feels stable, I am going to update the Debian package of course.

4 April 2014

Mario Lang: Accessible voting graphs

It is that time of the year again: Debian is electing the Project Leader for 2014/15. Whenever a Debian vote is in progress, I find myself rather happy to rediscover that Debian provides text-mode voting graphs. These are quite accessible to me as a braille user. It is rather unusual for me as a blind person to be able to access any graphs on the internet at all. All this has been made possible by gnuplot's ability to generate text plots, and Manoj's willingness to implement it during his term as project secretary. Thanks to Gnuplot and Manoj, and thanks to the current secretary for keeping this feature. It is (at least to me) very nice to have, and it actually makes Debian rather unique. I personally don't know of any other major projects which provide text graphs. We are indeed setting a very good example here. It would be nice if other projects would adopt this as well. This is bridging the digital divide for me.

For completeness' sake I should probably mention that text graphs are not a universal solution for blind users. Those of us who do not use braille will probably have a very hard time extracting any meaningful information from this ASCII character salad. But for a braille user used to reading two-dimensional information from the screen, it actually works rather well. Some solutions from the 90s, when people didn't have graphical terminals readily available everywhere, are still very good accessibility workarounds. I am going to post another article about how I play chess on a computer, with newsgroup-style chessboard diagrams. Now that I think of it, these two topics are very related. There are some rather nifty solutions floating around from the good old text-mode days which we need to revive before they finally get forgotten on the internet.

27 March 2014

Mario Lang: Demonstrating a common m17n a11y problem with a web audio game

There is a rather infamous multilingualisation bug in almost every screen reader I have ever used. As a German native speaker, I usually set my operating system user interface language to German. However, since I do a lot of technical work, I frequently end up on English websites. But if I temporarily switch my speech synthesis language to English, certain user interface elements and key names are not translated! So the screen reader continues to say "Eingabe" or "Leertaste", but with an English accent! I consider this behaviour hilarious, especially in the 21st century.

To demonstrate how hilarious it actually is, I have made a little browser audio game. Voice snippets have been contributed by my girlfriend. A natural voice is still a lot better than sampled speech synthesizers, and sounds more fun, we think. The idea is simple: Angela will announce a key name, and you are supposed to hit it on the keyboard as fast as possible. After a round of 50 guesses (the time to answer will decrease with every guess) a final score will be announced. I usually don't write any JavaScript. I still managed to hack this up (with a little help from Simon) in roughly 2 hours of browsing Stack Overflow. The game is here. The code is on GitHub. If you have any problems getting this to work, please let me know or send a pull request. I'd like to get it working at least on iOS as well (after all, it is the touch device platform with the best accessibility support currently), however, I could not get mobile Safari to behave. So patches to enable use on touch devices are welcome. It should be simple: just add a few buttons which call the pressed function with the appropriate character as a parameter. However, I tried, and something is fishy regarding window.setTimeout since it does not behave the same way as it does with Firefox. Oh, and my favourite key name is "Runt Glamour Zoo" :-)

Now, this was funny. But it is also very sad. If you play this game, remember that blind and visually impaired computer users really have to cope with such accented pronunciations on a daily basis. We grew used to it, but if I take a step back, it's really unacceptable that we still have to cope with this daily. And this is the simple part of the bug, since it is only dealing with key labels, which are internal to the system. It gets much worse if you want your screen reader to automatically switch to the correct synthesis language. Language detection is apparently not good enough to be implemented, at least that's what they tell me. And if the document (HTML) specifies a language, it's wrong in more than 50% of the cases. This is a disaster. Come to think of it, that is probably a reason why I prefer braille over speech. With braille, this problem goes away completely.

26 March 2014

Mario Lang: Accessible DNS hosting with LuaDNS

I used to host my internet infrastructure at Hurricane Electric. It all started in October 1998 with POP3/SMTP, HTTP and DNS. In the coming years, I began to host all services except DNS on my own. But I kept using he.net for its DNS management interface. It was dead simple, and therefore accessible. All they had was a basic textarea with BIND-like configuration in it. I could log in to their admin interface and change my DNS records as desired. A few years ago, they auto-upgraded my account to their new shiny DNS panel, which, surprise surprise, is no longer accessible with a simple text browser. After a bit of bitching with support, they ended up downgrading my account back to the old functionality, so I was happy again. However, as you might guess, the last time I needed to change a DNS record, I found that the DNS panel had been upgraded yet again and was once more no longer accessible to me. So it was time to leave the sinking ship. But I needed to find an accessible DNS hosting service. Not an easy task, given that everyone seems to do more or less the same thing these days.
Git to the rescue!
After a bit of web searching it became apparent that most offers these days are not what I want. I want a simple interface without any danger of accessibility issues. In most cases, you cannot test the DNS management interface before signup. After a few dead ends, I took a step back and said to myself: "So, what is it that I am actually looking for? If this were a wishlist item, how would I like my workflow to be?" And the answer came immediately: "I want my zone files in a git repo!" So I decided to turn my search upside down and search for exactly that. And guess what, I found exactly what I was looking for: LuaDNS.

LuaDNS has 5 nameservers in Europe, Asia and North America. As the name implies, it offers a way to write your zone files with Lua. This can be quite helpful for programmatically generating zones. However, it also supports BIND-like zone files, which is what I use. The idea is simple: you create a Git repository on GitHub or BitBucket and let LuaDNS know where it is. A web hook can be set up to automatically trigger zone rebuilds once you push to your repository. So all my accessibility problems around DNS hosting are suddenly completely gone. Once I have edited/committed my zone files and pushed to my repository, LuaDNS will automatically pull from the repository and update my zones. This approach gives me:
  • I can edit my zones with my editor of choice without having to go through the web.
  • I have history for my DNS changes.
  • I can revert changes easily.
  • Changes can have descriptive commit log entries.
  • All the usual advantages of Git.
And I will never have to fight with an inaccessible web interface again. That said, LuaDNS does have a web interface for administering account settings. It works very nicely with Lynx. I hope they keep it that way.
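In day-to-day use, updating a record is then just an edit-commit-push cycle, roughly like this (a sketch; the repository path, zone file name and commit message are made-up examples):
$ cd ~/src/dns                  # the Git checkout LuaDNS knows about
$ $EDITOR example.org           # edit a BIND-like zone file
$ git commit -am "Point www at the new server"
$ git push                      # the web hook triggers the zone rebuild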
Current LuaDNS limitations
There are two things I don't particularly like about LuaDNS currently:
  • There are currently no AAAA records for their DNS servers. This is apparently being worked on and is supposed to be fixed somewhere around April 2014. Note that AAAA records are perfectly supported in LuaDNS zone files; it is just that none of the DNS servers they provide to you offers any AAAA records yet. So IPv6-only hosts might have trouble reaching your site if they don't use an IPv4-enabled recursor (a rather rare setup, I guess).
  • Due to the way BIND-like zone files work in LuaDNS (SOA records are autogenerated), you currently cannot sign your zones. I've been told DNSSEC is on the list of things to work on at LuaDNS, so I am looking forward to seeing what they will implement.
The team is friendly and was very fast to react to a question via email. Looks good, I'll stay. Now that I think of it, this article might be considered an answer to Steve Kemp's question what would you pay for: I'd pay for a VCS-based DNS hosting solution that allows me to use DNSSEC, if its web interface were kept clean and simple and therefore accessible. However, I don't mind a free account for low-volume usage at all. Especially if that makes it easy to test the service and make sure it works as expected.

17 March 2014

Mario Lang: grub-efi-amd64 on a MacBook Air

When I installed Debian on my MacBookAir4,1 in 2012, there was no way to do it without manual intervention yet. I followed a tutorial on how to boot GRUB from the EFI Boot option in the MacBook EFI menu. I did not want to fiddle with rEFIt, and I didn't want to boot GRUB by default. When I want to boot Linux, I press the Option key during power-up and select EFI Boot from the Apple boot menu. Unfortunately, I neglected to collect notes about how I did it manually. However, Linux 3.2 is getting a bit old, so I finally wanted to replace my manual boot configuration with something handled by the package system. In case you don't have /boot/efi in /etc/fstab yet, you need to mount /dev/sda1 on /boot/efi for the following to work. The documentation I found on the net suggested just reinstalling grub-efi-amd64, after which everything should work. That is not quite true. When I do
# apt-get install --reinstall grub-efi-amd64
nothing changes in /boot/efi. I sort of expected that /boot/efi/EFI/debian would be created, and the EFI image would be placed in there. However, that did not happen. Why is that? It turns out that when I installed grub-efi-amd64 manually in 2012, I created /boot/efi/EFI/boot/bootx64.efi, which is the EFI fallback location, and apparently exactly what I want on this MacBook, which does not support multiple boot options. Matthew Garrett posted an interesting article called Booting with EFI which sheds light on this issue, go and read it. Looking at /var/lib/dpkg/info/grub-efi-amd64.postinst revealed that /boot/efi/EFI/debian needs to be created manually first. If this directory does not exist, grub-efi-amd64 basically does nothing on reinstall. Running grub-install will actually create a new EFI image. However, it is being created in the wrong place for this machine.
# grub-install --target=x86_64-efi
does the trick. Now /boot/efi/EFI/debian/grubx64.efi gets created. However, since I don't want to make GRUB the default, there is yet another manual step to do:
# cp /boot/efi/EFI/debian/grubx64.efi /boot/efi/EFI/boot/bootx64.efi
Now I can select EFI Boot after pressing the Option key during startup. GRUB is loaded and Linux 3.13 gets booted. Strike! Looking more closely reveals that there is actually a way to tell grub-install that it should install to the fallback location directly: the --removable option does that. For the faint of heart, what does grub-install actually do on an EFI system? It does not directly write to the disk, therefore it does not need a device specified. It looks for files in /boot/efi and assumes the EFI partition is mounted there. So for my use case, the correct way to upgrade to a current GRUB EFI image should have been:
# grub-install --removable --target=x86_64-efi
Meanwhile I've been made aware of Bug#708430. I guess it would be nice to have an option in /etc/default/grub which would indicate that installation to the fallback location is desired. While this is a rather ugly hack to work around a stupid limitation, it is still what I'd like on this MacBook, at least since I don't have a triple-boot situation. The fallback location works fine with just two OSes coexisting.
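For completeness, the /boot/efi mount mentioned at the beginning has to exist before any of this works. On this machine the EFI system partition is /dev/sda1, so the corresponding /etc/fstab entry would look roughly like this (the mount options are an assumption):
# /etc/fstab
/dev/sda1  /boot/efi  vfat  umask=0077  0  1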

13 March 2014

Mario Lang: Living in RAM

I switched to SSD on my home server a few years ago, at a time when SSDs were still new and sort of hyped. However, I immediately knew that this type of technology was for me: it is fast and rather silent. SSD storage has this air of being fragile though. You are supposed not to wear it out too much. It's not like in the good old days when you would write to your hard disk without any extra thought.
tmpfs
Before I switched to SSD I was already using tmpfs to store /tmp in RAM, for performance reasons mostly. /etc/default/tmpfs:
RAMTMP=yes
Since RAM is relatively cheap these days I have lots of it in my home server.
/var/cache/apt/archives
When I switched to SSD, I extended this concept by mounting /var/cache/apt/archives via tmpfs. This saves disk I/O for package updates. And for a machine running sid which is more or less regularly updated, this avoids quite a few useless writes to disk, roughly 100MB per update.
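The corresponding /etc/fstab entry looks roughly like this (the size limit is an assumption, pick whatever fits your RAM):
# /etc/fstab
tmpfs  /var/cache/apt/archives  tmpfs  defaults,size=2g  0  0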
/tmp/src
Nowadays, I have almost all of my source code checkouts and build-trees in /tmp/src. myrepos makes this very easy to handle. Simply clone all repositories you are interested in, and remember their details with mr register. After a reboot, all you need to do is:
$ mkdir /tmp/src
$ cd !$
$ mr up
Your repository locations have (likely) been recorded in ~/.mrconfig. Of course there is the danger of accidentally losing data due to a power outage. However, I feel this leads to a rather clean workflow. Changes need to be pushed anyway. So since you cannot trust your local repository to really persist, you are forced to push your changes regularly, which is a good thing! Besides, I have a UPS at home which gives me roughly one hour of backup power. This was enough to sustain every power outage I have witnessed in the past years. Typically, outages are rather short-lived in my area, somewhere between a few seconds and 5 minutes. It is rather rare that electricity goes missing for more than half an hour. The only downside of this approach is that from time to time, you will use more bandwidth by re-cloning repositories after a reboot. But really, who needs reboots on Linux? Last time I had half a year of uptime, and only rebooted because I wanted to jump from Linux 3.8 to 3.13. This approach only works on long-running machines though. It is probably not very useful for a laptop.
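For reference, the entries that mr register writes to ~/.mrconfig look roughly like the following (a sketch; the path and clone URL are made-up examples):
[/tmp/src/some-project]
checkout = git clone 'https://example.com/some-project.git' 'some-project'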

4 March 2014

Mario Lang: Personal mail server

I used to use Fetchmail and Gnus with its mail splitting capabilities and nnml backend to handle my private mail setup for many years. This worked pretty well since I am used to accessing my home server through SSH. I have Gnus running inside of a GNU screen session, so I can check mail remotely by using an SSH terminal. This works fine as long as I have a good SSH terminal on a desktop or laptop computer. However, it does not work very well on tablets or smaller mobile devices. Additionally, mail splitting has become a performance burden over the years. I do not want to wait for Gnus to sort incoming mail into different folders while checking for new mail. That is something which should already have been done in the background, and that's what we are going to cover with the setup described below.

So I had to change my simple setup to accommodate the new trends in mobile computing. The obvious core of such a setup is an IMAP server which receives and stores your mail such that different clients can access it. So the days of my Gnus nnml storage are definitely over. Mail is no longer stored in and by my mail client.
Dovecot
While there are several IMAP server solutions out there, I find Dovecot fits my needs quite nicely. I've decided to store my mail in Maildir format in ~/Maildir. I prefer storing data like that in the home directory to avoid having to back up separate files from /var/. Maildir also features some index files which should help performance in the long run. Incoming mail will be delivered solely by fetchmail and should be checked for spam. While I could probably configure Exim to run SpamAssassin on mails before delivering them to Dovecot, there is a much more elegant solution: the Dovecot local delivery agent (LDA). /usr/lib/dovecot/deliver takes mail from standard input, performs Sieve filtering and updates the mail indexes. We will call this executable more or less directly from fetchmail. Incoming mail from mailing lists will be sorted into different folders using Sieve. Dovecot needs to be told to enable the sieve plugin and to create new folders on demand. /etc/dovecot/local.conf:
disable_plaintext_auth = yes
mail_location = maildir:~/Maildir
lda_mailbox_autocreate = yes
lda_mailbox_autosubscribe = yes
protocol lda {
  mail_plugins = sieve
}
Sieve scripts are actually quite intuitive once you have a template to start from. ~/.dovecot.sieve:
require "fileinto";
if exists "X-Spam-Flag"  
  # Store spam tagged by SpamAssassin into dedicated Spam folder
  if header :contains "X-Spam-Flag" "YES"  
    fileinto "Spam";
   
  elsif exists "X-Cron-Env"  
  # Store mails from Cron daemon in dedicated folder
  fileinto "cron";
  elsif exists "List-Id"  
  # File list-mail into dedicated folders, matching on List-Id
  if header :contains "List-Id" "boost-users.lists.boost.org"  
    fileinto "boost-users";
    elsif header :contains "List-Id" "brltty.mielke.cc"  
    fileinto "brltty";
    elsif header :contains "List-Id" "debian-accessibility.lists.debian.org"  
    fileinto "debian-accessibility";
    elsif header :contains "List-Id" "debian-devel-announce.lists.debian.org"  
    fileinto "debian-devel-announce";
    elsif header :contains "List-Id" "debian-devel.lists.debian.org"  
    fileinto "debian-devel";
    elsif header :contains "List-Id" "spirit-general.lists.sourceforge.net"  
    fileinto "spirit-general";
   
  # ...
 
SpamAssassin
Since I want automatic classification of spam messages, I use SpamAssassin. Just install spamassassin and enable spamd in /etc/default/spamassassin:
ENABLE=1
We will use spamc in the Fetchmail configuration.
Fetchmail
My ~/.fetchmailrc is a straightforward list of some mailboxes to fetch mail from. I use the mda directive to skip the MTA and send mail through SpamAssassin and deliver it to Dovecot via its LDA mechanism. ~/.fetchmailrc:
set daemon 1200 # poll interval in seconds (20 minutes)
poll blind.guru protocol IMAP: ssl;
# ... add more sources here ...
mda "/usr/bin/spamc -u %T -e /usr/lib/dovecot/deliver -d %T"
To avoid spreading access information in too many configuration files I am using the ability of Fetchmail to use netrc to retrieve account passwords. ~/.netrc:
machine blind.guru login mlang password <hidden>
Gnus
I have been using Gnus to read mail, newsgroups and RSS feeds for many years now. It would be quite a mouthful to explain all the customizations I am using by now. But there is one very important bit in the context of this article: how to access the IMAP server? In my setup, Emacs, and therefore Gnus, is running on the same machine as the IMAP server, so I can avoid authentication altogether. This configuration will avoid unnecessary password prompts or caching. In ~/.emacs or ~/.gnus:
(setq gnus-secondary-select-methods '((nnimap "localhost"
                                       (nnimap-stream shell)))
      nnimap-shell-program "/usr/lib/dovecot/imap")
With this you should be able to subscribe to your IMAP folders from within Gnus with ease. Sorting incoming mail into folders is now performed by the IMAP server through Sieve scripts. Instead of changing Gnus' configuration, I now edit ~/.dovecot.sieve when I subscribe to a new mailing list. If you add a new Sieve rule for a mailing list and the associated folder does not exist yet, Dovecot will autocreate it, which is very convenient.
Mobile devices
Now all that is left is a way for your mobile devices to read and eventually send mail. This is very much dependent on your network setup, so I am not going to go into any detail here. If you are accessing your mail setup from a tablet in your local network you might get away without tinkering with your router configuration. If you want to read/send mail on the go you need some way to get to your external IP. Either it is stable enough, or you need some dynamic DNS service. You will definitely want to forward the IMAP and maybe SMTP ports from your router to your home server. If you don't have an existing SMTP server for your mobile device that accepts your outgoing mails, you can also set one up yourself and deliver outgoing mails from your mobile device to the world with Exim or qmail. I am personally using Exim since I am going with the default MTA for Debian. Configuring Exim to take mail from iOS devices was as simple as enabling an appropriate authentication method and adding an account to /etc/exim4/passwd. I have to admit though that I don't particularly like Exim's configuration files. That is why I ended up using Dovecot's LDA in the first place.

25 February 2014

Mario Lang: I am a programming language

I love learning about programming languages. But this one really took me by surprise. Yes, apparently there is a Brainfuck-like two-dimensional esoteric programming language called MarioLANG. And GitHub even has an implementation written in Ruby. I should really allocate a bit of spare time to write at least something in it myself.

19 February 2014

Mario Lang: Enhancing Hub to support GitHub social features

In my article Contributing on GitHub I recently explained that I cannot use certain social features of GitHub on their website directly. I also explained that I make great use of Hub to circumvent some accessibility issues I have with GitHub. Well, it turns out that adding star, unstar, follow and unfollow to Hub was actually a lot easier than I thought. It has been a long time (probably 10 years) since I last touched Ruby code. However, since I learn a lot from examples, I was actually able to add what I needed just by looking at the surrounding code. Well, I had to look up how you add an element to the end of an array (push), but that was basically it. You can have a look at the final outcome in Hub pull request #490. The code resides in the social branch of my hub fork. If you already have an OAuth token for hub, you need to log in to GitHub and edit your token to add the user:follow scope if you want to use the follow and unfollow commands. Interestingly, this type of operation is accessible enough to still work with Lynx! I hope this gets merged; I don't particularly like to maintain my own forks of quite obvious functionality. Anyway, problem solved, mostly. All I am missing now is the ability to delete a repository. A dangerous feature, so probably controversial. Anyway, something for another day.

18 February 2014

Mario Lang: Contributing on GitHub

I really like to read code written by different people. I've always been the type of learner who can cover a lot of ground by looking at examples. A few of the programming tricks I have acquired over time have definitely been learnt by reading programs written by other people. There are of course cases of disagreement with coding style (let's put that very mildly), so not every project is fun to read. But I tend to find projects I like enough to actually look at their source code. Now, if you get into this habit, there is also the occasional spotting of a bug. Many are really cosmetic, like typos in comments or documentation. Others are maybe more serious; who knows what can happen if you read a few lines of code written by someone you have never met in person. This is one of the original arguments of open source and free software. It basically goes like: "People will read your code and report problems." I know that happens, but I also know that there are cases of people finding relatively minor things during a small code review which they don't report, or which get lost in some developer's INBOX. I remember waiting months to see memory corruption fixes for a particular project, which I had submitted by email, actually appear in the development repository. In some cases you really need to track your patches and prod someone to remind them about your contribution if need be. True, some of these issues are probably minor enough to not really hurt if they get lost, but I have a feeling that a lot of minor things also sort of add up.

This is where GitHub comes into play. We have had project hosting services for at least a decade now, but none of them (in my experience) have ever made it so easy for all involved parties to submit and merge changes into projects. Pull requests are just amazing. Well, at least for me, they have been for a few months. I have a sort of mixed hate-love relationship with GitHub when it comes to a topic that is very important to me, accessibility. The last project hosting site I really liked for its simple web interface (read: still usable with Lynx) was Google Code. That was in the days when Google still seemed to care about such things. These days, their newer services are next to unusable to me (with the tools I prefer). SourceForge was always a catastrophe. Many things were cumbersome to do, because the site is so loaded with links to all sorts of things I never need. However, at the time I was using it, all the things I really needed did work. GitHub is a different beast, in a different era of the internet. While I can read project pages and do a few basic things perfectly fine, some parts of the GitHub site are simply unusable for no particular reason. For instance, I can edit (some of my) preferences and even save them. But I cannot star a project, nor can I follow other developers. Invoking these links gives me a 404, which seems to indicate that some JavaScript (read: client-side code execution) is required which my browser does not perform. However, I really wonder why. Because something like "I want to star this project" will eventually have to end up as a request on the server anyway, so it seems technically unnecessary to rely on JavaScript for such an operation to succeed.

Luckily, these days, hip projects do have an API. And GitHub is no exception. I want to emphasize that an API alone is not an accessibility fix, since you rarely have the time to implement a client for a random API just to get access to a service you want to use. Sometimes you do, and an API is a great enabler in such situations, but a reference client with a rich feature set is much better. I have to admit that I haven't evaluated any of the possible alternatives, but one client I found for GitHub's API did really boost my productivity in the last year. I am talking about Hub. It is a simple wrapper around Git that adds a few git commands and enhances some others. Most importantly, hub fork will fork the current checkout on GitHub, and hub pull-request allows you to quickly (and I really mean quickly!) open a pull request for some commits you've just made to your fork of some project. This is very helpful, and very accessible. While it does not solve all my accessibility problems with GitHub, it at least enables me to take part in the GitHub workflow. I can easily fork projects and submit pull requests for changes I did. And that is what I did, at least a bit, in roughly the past year. I am not sure exactly, but I think it was spring when someone made me aware of Hub. One approach would be to check my GitHub activity graph, but I probably don't have to mention that it is not accessible. Here is a list of some of my contributions to open source projects on GitHub since I discovered a good client for the GitHub API:

To summarize: I love GitHub for what it does to the free software community. I think the simplicity (and transparency) of contributions is to the benefit of free and open source software. However, I also hate GitHub sometimes, when I need to ask a friend to do relatively trivial changes for me. What I actually want is a command-line client like Hub that makes most or even all functionality provided on the website available. True, some things like graphs are not immediately easy to adapt. But I really shouldn't need to use a modern full-fledged web browser to tell GitHub that I am interested in the activities of a certain user. Or to delete a fork. Yes, that's right: I can use Hub to fork a project, but there is no way to delete it again. Sort of strange for casual small-scale contributions. I guess the workflow there is: fork, commit, pull-request, wait for merge, delete. I don't want to have all sorts of forks on GitHub that I don't actually actively use/maintain. So once a pull request has been merged, and I don't have any other things I am working on right now, I actually want to get rid of the fork again. It is easy enough to fork again should I ever need it.
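For the record, the resulting contribution workflow is roughly this (a sketch; repository, user and branch names are made-up, and hub fork is assumed to add a push remote named after your GitHub account):
$ git clone https://github.com/someuser/someproject.git
$ cd someproject
$ git checkout -b fix-typo            # make and commit the change here
$ hub fork                            # fork on GitHub, add a remote for your user
$ git push your-github-user fix-typo
$ hub pull-request                    # open the pull request for that branch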

14 February 2014

Mario Lang: Type erasure

Andrzej's C++ blog has a nice series on type erasure. I found it interesting to read and have learnt some things from it. For instance, I vaguely guessed, but did not fully realize, that std::function<> is of course a performance hit, since the way std::function<> is implemented makes it impossible for the compiler to do anything useful with that function call. Inlining is totally out of the question, since we have just erased the type information. You might say, "of course!", but for me it was sort of a revelation. The series is in four parts: part I, part II, part III, part IV.
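A tiny illustration of the point (a sketch, not taken from the series): with a template parameter the compiler sees the concrete callable type and can inline the call, while std::function has erased that type behind an indirection:
#include <functional>

template <typename F>
int apply_template(F f, int x) { return f(x); }      // concrete type, inlinable

int apply_erased(std::function<int(int)> f, int x) { return f(x); }  // type erased

int main() {
  auto twice = [](int x) { return 2 * x; };
  int a = apply_template(twice, 21);  // the lambda's type is known here
  int b = apply_erased(twice, 21);    // the call goes through std::function
  return a == b ? 0 : 1;              // both compute 42
}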
